Bug fix/sda fabric devices pcg #409

madhansansel merged 5 commits into cisco-en-programmability:main
Conversation
…as per the new design changes. UTs also updated
…r descriptions and improved handling of empty queries
- Used to narrow down which fabric sites and devices should be included in the generated YAML file.
- If no filters are provided, all fabric devices from all fabric sites in Cisco Catalyst Center will be retrieved.
type: dict
- Each list entry targets a specific fabric site and optionally narrows down by device IP or roles.
Can we add the AND and OR logic? You can read and update if required.
- Within a single entry, all specified filters are combined using AND
logic. Omitting a filter means no restriction on that attribute.
- Multiple entries are combined using OR logic, allowing retrieval from
different fabric sites in a single invocation.
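The AND-within-an-entry, OR-across-entries semantics described above can be sketched in Python. This is an illustrative standalone sketch; `device_matches_entry` and `select_devices` are hypothetical names, not part of the module.

```python
def device_matches_entry(device, entry):
    """All filters present in one entry must match (AND logic).
    Omitted filters impose no restriction on that attribute."""
    if "fabric_name" in entry and device["fabric_name"] != entry["fabric_name"]:
        return False
    if "device_ip" in entry and device["ip"] != entry["device_ip"]:
        return False
    if "device_roles" in entry and not set(entry["device_roles"]) & set(device["roles"]):
        return False
    return True


def select_devices(devices, entries):
    """A device is retrieved if it matches ANY entry (OR logic)."""
    return [d for d in devices if any(device_matches_entry(d, e) for e in entries)]
```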
self.log(
    f"Fabric site '{fabric_name}' not found in Cisco Catalyst Center. Skipping filter entry {filter_idx}.",
    "WARNING",
)
When a filter entry is skipped (fabric not found, device not found), the code logs a WARNING and continues, but the user never sees these warnings in the module result. Can we collect the skipped entries and include them in the final result['response'] message or result['warnings'] so the user knows which entries were silently dropped? Is it a good idea or overhead? Can you check with @DNACENSolutions / team and update?
I see that in many places we log with "WARNING" in log messages, but it is not captured in the final result.
Will implement this for all the modules in a separate PR.
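The reviewer's suggestion can be sketched as follows. This is a hypothetical sketch, not the merged implementation; `FilterProcessor`, `skip_entry`, and `build_result` are illustrative names.

```python
class FilterProcessor:
    """Sketch: accumulate skipped filter entries so they surface in the
    module result, not only in the debug log."""

    def __init__(self):
        self.skipped_entries = []  # collected across all filter entries

    def log(self, msg, level="INFO"):
        print(f"[{level}] {msg}")

    def skip_entry(self, filter_idx, reason):
        msg = f"Skipping filter entry {filter_idx}: {reason}"
        self.log(msg, "WARNING")
        self.skipped_entries.append(msg)

    def build_result(self, response):
        result = {"response": response}
        if self.skipped_entries:
            # Ansible surfaces result["warnings"] to the user by convention
            result["warnings"] = list(self.skipped_entries)
        return result
```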
device_info_map = self.get_device_list(device_list_params)
if not device_info_map or device_ip not in device_info_map:
    self.log(
        f"Device with IP '{device_ip}' not found in Cisco Catalyst Center. Skipping filter entry {filter_idx}.",
params_for_query["networkDeviceId"] = network_device_id
if not params_for_query:
    self.log(
        f"No valid filters provided for filter entry {filter_idx}, skipping.",
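The skip-and-continue behavior shown in these hunks, together with the empty-result guard described in the PR summary, can be sketched as a standalone function. `resolve_fabric` and `resolve_device` are hypothetical stand-ins for the module's lookup helpers, and `build_query_params` is an illustrative name.

```python
def build_query_params(filter_entries, resolve_fabric, resolve_device, log):
    """Sketch: build API query params per filter entry; invalid entries are
    skipped with a warning so other valid entries are still processed."""
    params_list = []
    for filter_idx, entry in enumerate(filter_entries, start=1):
        params = {}
        fabric_id = resolve_fabric(entry.get("fabric_name"))
        if fabric_id is None:
            log(f"Fabric site not found. Skipping filter entry {filter_idx}.", "WARNING")
            continue
        params["fabricId"] = fabric_id
        device_ip = entry.get("device_ip")
        if device_ip:
            device_id = resolve_device(device_ip)
            if device_id is None:
                log(f"Device '{device_ip}' not found. Skipping filter entry {filter_idx}.", "WARNING")
                continue
            params["networkDeviceId"] = device_id
        params_list.append(params)
    # Return None (not a truthy empty container) so the caller drops the component
    return params_list or None
```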
    comp for comp in components_list if comp not in network_elements
]
if invalid_components:
    valid_components = list(network_elements.keys()) + ["component_list"]
Is it "component_list" or "components_list"?
In line number 290, we are using components_list. Can you check and update?
if req_filter_spec.get("required", False) and req_filter_name not in component_filter:
    invalid_filters.append(
        "Component '{0}' filter entry {1}/{2} is missing required filter '{3}'".format(
            component_name, index, len(component_filters), req_filter_name
Is it correct to say index, len(component_filters)?
If len(component_filters) is 5 and index starts at 0, then the messages read 0/5, 1/5, 2/5, 3/5, 4/5, which may mislead users. Can you please check? If index starts at 1 then it's fine; otherwise use index + 1, len(component_filters).
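The reviewer's suggestion can be sketched as a standalone version of the validation loop, using enumerate(..., start=1) for 1-based display. `check_required_filters` is an illustrative name, not the module's function.

```python
def check_required_filters(component_name, component_filters, valid_filters):
    """Sketch: flag filter entries missing a filter marked required, with
    1-based entry numbering so 'entry 1/5 .. 5/5' reads naturally."""
    invalid = []
    total = len(component_filters)
    for index, component_filter in enumerate(component_filters, start=1):
        for name, spec in valid_filters.items():
            if spec.get("required", False) and name not in component_filter:
                invalid.append(
                    "Component '{0}' filter entry {1}/{2} is missing required filter '{3}'".format(
                        component_name, index, total, name
                    )
                )
    return invalid
```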
for filter_name, filter_value in component_filter.items():
    self.log(
        "Processing filter '{0}' in entry {1}/{2} for component '{3}': value={4}".format(
            filter_name, index, len(component_filters), component_name, filter_value
Same as above; we need to check the entire file.
fabric_name: "Global/India/Bangalore"
device_roles: ["BORDER_NODE"]
- fabric_name: "Global/India/Bangalore"
  device_roles: ["BORDER_NODE"]
I don't see an example for multiple fabric sites. Can we add it?
# Example 8: Generate configuration for devices from multiple fabric sites
- name: Generate configuration from multiple fabric sites
hosts: dnac_servers
vars_files:
- credentials.yml
gather_facts: false
connection: local
tasks:
- name: Export fabric devices from two fabric sites
cisco.dnac.sda_fabric_devices_playbook_config_generator:
dnac_host: "{{ dnac_host }}"
dnac_port: "{{ dnac_port }}"
dnac_username: "{{ dnac_username }}"
dnac_password: "{{ dnac_password }}"
dnac_verify: "{{ dnac_verify }}"
dnac_debug: "{{ dnac_debug }}"
dnac_version: "{{ dnac_version }}"
dnac_log: true
dnac_log_level: DEBUG
dnac_log_append: false
dnac_log_file_path: "{{ dnac_log_file_path }}"
state: gathered
config:
component_specific_filters:
components_list: ["fabric_devices"]
fabric_devices:
- fabric_name: "Global/USA/SAN-JOSE"
device_roles: ["BORDER_NODE"]
- fabric_name: "Global/India/Bangalore"
device_roles: ["EDGE_NODE"]
        }
    ]
}
},
Shall we add a test case?
"filter_multi_fabric_sites_case_11": {
"component_specific_filters": {
"fabric_devices": [
{
"fabric_name": "Global/Site_India/Karnataka/Bangalore",
"device_roles": ["EDGE_NODE"]
},
{
"fabric_name": "Global/India/Telangana/Hyderabad/BLD_1",
"device_roles": ["CONTROL_PLANE_NODE"]
}
]
}
}
koderchit left a comment:

Addressed all the review comments
Merged 2b999e8 into cisco-en-programmability:main
Type of Change
fabric_devices filter must be a list of dicts

Summary:
The fabric_devices field under component_specific_filters was typed and handled as a single dictionary. The brownfield_helper validation layer enforces a list type for all component filters, so any playbook supplying fabric_devices as a dict would fail immediately with "Component 'fabric_devices' filters must be a list".

Steps to Reproduce:

Observed Behavior:

Expected Behavior:
fabric_devices should be accepted as a list of filter dictionaries.

Root Cause Analysis:
fabric_devices was declared as type: dict in the module DOCUMENTATION and processed via a single .get() call in get_fabric_devices_configuration, conflicting with validate_component_specific_filters, which expects a list of dicts.

Fix Details:
- Changed the fabric_devices type from dict to list / elements: dict in DOCUMENTATION.
- Rewrote get_fabric_devices_configuration to iterate over the list of filter entries; invalid/not-found entries are skipped with a warning so other valid entries in the same list are still processed.
- Updated playbooks/sda_fabric_devices_playbook_config_generator.yaml to use list syntax.
- Updated tests/unit/modules/dnac/fixtures/sda_fabric_devices_playbook_config_generator.json to list format.

Impact:
Breaking change for existing playbooks using fabric_devices as a dict. All consumers must update to list syntax. The new loop-based approach also enables multiple fabric site filters in a single module invocation.

fabric_name required parameter not enforced in fabric_devices filter

Summary:
fabric_name is documented as required: true under fabric_devices in component_specific_filters, but the module executed without error when it was omitted, failing silently later at the API level.

Steps to Reproduce:

Observed Behavior:
The module executes successfully with no validation error, then fails at the API level:

Expected Behavior:

Root Cause Analysis:
validate_component_specific_filters in brownfield_helper.py only iterated over filters that were present in the submitted entry. It never checked for filters marked "required": True that were absent.

Fix Details:
Added a required-field check in validate_component_specific_filters before the per-filter validation loop. For each filter entry, the code now iterates valid_filters_for_component and appends an error for any filter spec with "required": True that is missing from the entry.

Impact:
Any fabric_devices filter entry omitting fabric_name is now rejected at validation time with a clear error message. The fix is in the shared brownfield_helper.py and applies to all modules using validate_component_specific_filters.

Non-existent fabric_name generates empty config instead of warning

Summary:
When a fabric_name that does not exist in Cisco Catalyst Center is supplied, the module silently produced an empty configuration (fabric_devices: []) rather than reporting that no matching data was found.

Steps to Reproduce:

Observed Behavior:

Expected Behavior:

Root Cause Analysis:
get_fabric_devices_configuration had two code paths that returned a truthy value when no data was found:
- With an invalid fabric_name, execution continued past an empty fabric_devices_params_list_to_query and returned a result with an empty list.
- When no devices were found, it returned {"fabric_devices": []} instead of None.

Because yaml_config_generator only skips components when component_data is falsy, these truthy-but-empty returns were appended to final_config_list and written to the YAML file.

Fix Details:
Three changes in get_fabric_devices_configuration:
- Added a None guard when fabric_devices_params_list_to_query is empty after processing all filter entries.
- Changed return {"fabric_devices": []} (no devices found) to return None.
- Added a None return guard after modify_parameters yields no transformed results.

Impact:
Users providing an invalid or non-existent fabric_name now receive a clear status message instead of a misleading empty output file.

component_list missing from valid-components list in error message

Summary:
When an invalid key is provided under component_specific_filters, the error message did not include component_list in the valid-components hint. Since component_list is a valid and expected key, the error gave users an incomplete picture of accepted keys.

Steps to Reproduce:

Observed Behavior:

Expected Behavior:

Root Cause Analysis:
validate_component_specific_filters built the valid-components hint using list(network_elements.keys()). Because component_list is extracted and handled separately before the component loop, it was never included in network_elements and never appeared in the error message.

Fix Details:
The valid-components list is now built as:
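As captured in the diff hunks earlier in the review, the change is the single line below; network_elements here is an illustrative stand-in for the module's registry of supported components.

```python
# Illustrative stand-in for the module's registry of supported components.
network_elements = {"fabric_devices": {"filters": {}}}

# Line from the PR diff: include the separately-handled "component_list"
# key in the valid-components hint shown to the user. (Review discussion
# also flagged the "component_list" vs "components_list" naming for a
# follow-up check.)
valid_components = list(network_elements.keys()) + ["component_list"]
```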
Impact:
Users who provide an invalid key under component_specific_filters now see component_list in the valid-components hint. The fix applies to all modules using validate_component_specific_filters.

Testing Done

Test cases covered:
- fabric_name only, device_ip, single/multi device_roles, all filters combined, explicit components_list, and file_mode: append.
- … fabric_name and is unaffected by the new required-field check.

Checklist
Ansible Best Practices
- … (ansible-vault or environment variables)

Documentation
Screenshots (if applicable)
N/A